
    Interior Point Methods for Massive Support Vector Machines

    We investigate the use of interior point methods for solving quadratic programming problems with a small number of linear constraints, where the quadratic term consists of a low-rank update to a positive semi-definite matrix. Several formulations of the support vector machine fit into this category. An interesting feature of these particular problems is the volume of data, which can lead to quadratic programs with between 10 and 100 million variables and a dense Q matrix. We use OOQP, an object-oriented interior point code, to solve these problems because it allows us to easily tailor the required linear algebra to the application. Our linear algebra implementation uses a proximal point modification to the underlying algorithm, and exploits the Sherman-Morrison-Woodbury formula and the Schur complement to facilitate efficient linear system solution. Since we target massive problems, the data is stored out-of-core and we overlap computation and I/O to reduce overhead. Results are reported for several linear support vector machine formulations, demonstrating the reliability and scalability of the method.
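    As a hedged illustration of the Sherman-Morrison-Woodbury idea this abstract mentions (a rank-1 special case with invented data and names, not the authors' implementation): when a matrix is a cheap-to-invert diagonal D plus a low-rank term u vᵀ, a solve against the full matrix reduces to solves against D alone.

    ```python
    # Sketch: solving (D + u v^T) x = b via the Sherman-Morrison formula,
    # using only solves against the diagonal matrix D. This is a rank-1
    # special case of the Sherman-Morrison-Woodbury technique; variable
    # names and data are illustrative, not from the paper.

    def solve_diag_plus_rank1(d, u, v, b):
        """Solve (diag(d) + u v^T) x = b without forming the dense matrix."""
        di_b = [bi / di for bi, di in zip(b, d)]      # D^{-1} b
        di_u = [ui / di for ui, di in zip(u, d)]      # D^{-1} u
        vt_di_b = sum(vi * x for vi, x in zip(v, di_b))
        vt_di_u = sum(vi * x for vi, x in zip(v, di_u))
        alpha = vt_di_b / (1.0 + vt_di_u)             # scalar correction factor
        return [x - alpha * y for x, y in zip(di_b, di_u)]

    # Tiny usage check: apply (D + u v^T) to x and compare with b.
    d = [4.0, 5.0, 6.0]
    u = [1.0, 2.0, 3.0]
    v = [0.5, 0.1, 0.2]
    b = [1.0, 2.0, 3.0]
    x = solve_diag_plus_rank1(d, u, v, b)
    vt_x = sum(vi * xi for vi, xi in zip(v, x))
    residual = [d[i] * x[i] + u[i] * vt_x - b[i] for i in range(3)]
    ```

    The same identity extends to rank-k updates (the Woodbury form), which is what makes the dense Q matrix in the SVM setting tractable.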

    A Two-Level Approach to Large Mixed-Integer Programs with Application to Cogeneration in Energy-Efficient Buildings

    We study a two-stage mixed-integer linear program (MILP) with more than 1 million binary variables in the second stage. We develop a two-level approach by constructing a semi-coarse model (coarsened with respect to variables) and a coarse model (coarsened with respect to both variables and constraints). We coarsen binary variables by selecting a small number of pre-specified daily on/off profiles. We aggregate constraints by partitioning them into groups and summing over each group. With an appropriate choice of coarsened profiles, the semi-coarse model is guaranteed to find a feasible solution of the original problem and hence provides an upper bound on the optimal solution. We show that solving a sequence of coarse models converges to the same upper bound in a provably finite number of steps. This is achieved by adding violated constraints to coarse models until all constraints in the semi-coarse model are satisfied. We demonstrate the effectiveness of our approach on a cogeneration application for buildings. The coarsened models allow us to obtain good approximate solutions in a fraction of the time required to solve the original problem. Extensive numerical experiments show that the two-level approach scales to large problems that are beyond the capacity of state-of-the-art commercial MILP solvers.
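    A toy sketch of the refinement loop described above (the tiny binary program and all data are invented, not from the paper): constraints are aggregated by summing over groups, the coarse model is solved, and any original constraint violated by the coarse solution is added back until the solution is feasible for the full constraint set.

    ```python
    from itertools import product

    # Toy data: maximize c.x subject to A x <= b, x binary.
    # Constraints 0 and 1 form one aggregation group; 2 is its own group.
    c = [3, 2, 2]
    A = [[2, 0, 0],
         [0, 2, 0],
         [0, 0, 1]]
    b = [1, 1, 1]
    groups = [[0, 1], [2]]

    def best_solution(rows, rhs):
        """Enumerate binary x maximizing c.x under the given constraints."""
        best, best_val = None, float("-inf")
        for x in product([0, 1], repeat=len(c)):
            if all(sum(a * xi for a, xi in zip(row, x)) <= r
                   for row, r in zip(rows, rhs)):
                val = sum(ci * xi for ci, xi in zip(c, x))
                if val > best_val:
                    best, best_val = x, val
        return best

    # Coarse model: one summed constraint per group.
    rows = [[sum(A[i][j] for i in g) for j in range(len(c))] for g in groups]
    rhs = [sum(b[i] for i in g) for g in groups]

    while True:
        x = best_solution(rows, rhs)
        violated = [i for i in range(len(A))
                    if sum(a * xi for a, xi in zip(A[i], x)) > b[i]]
        if not violated:
            break                      # x is feasible for the full model
        for i in violated:             # refine: add violated constraints back
            rows.append(A[i])
            rhs.append(b[i])
    ```

    Here brute-force enumeration stands in for an MILP solver; the point is only the aggregate-then-refine mechanic, in which each coarse solve is cheaper because it sees fewer constraints.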

    "The Procurement of Rarities is a Sign of Peace": Yanagawa Shunsan's Yokohama hanjō ki

    Author Institution: Randolph-Macon College

    An Optimal Control Model of Technology Transition

    This paper discusses the use of optimization software to solve an optimal control problem arising in the modeling of technology transition. We set up a series of increasingly complex models with such features as learning-by-doing, adjustment costs, and capital investment. The models are written in continuous time and then discretized by using different methods to transform them into large-scale nonlinear programs. We use a modeling language and numerical optimization methods to solve the optimization problem. Our results are consistent with findings in the literature and highlight the impact the choice of discretization has on the solution and its accuracy.
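    One way to see why the discretization choice matters (a toy sketch with an illustrative ODE, not the paper's model): transcribing continuous dynamics into NLP constraints with forward Euler versus the trapezoidal rule yields approximations of different order, and hence different solution accuracy for the same step count.

    ```python
    import math

    # Toy sketch: discretize the dynamics x'(t) = -x(t), x(0) = 1 on [0, 1]
    # with n steps, as one would when transcribing a continuous-time model
    # into a nonlinear program. The ODE and step count are invented.

    def euler(n):
        h, x = 1.0 / n, 1.0
        for _ in range(n):
            x = x + h * (-x)          # forward Euler: x_{k+1} = x_k + h f(x_k)
        return x

    def trapezoid(n):
        h, x = 1.0 / n, 1.0
        for _ in range(n):
            # implicit trapezoidal rule x_{k+1} = x_k + (h/2)(f(x_k) + f(x_{k+1})),
            # solved in closed form since f(x) = -x is linear:
            x = x * (1 - h / 2) / (1 + h / 2)
        return x

    exact = math.exp(-1.0)
    err_euler = abs(euler(20) - exact)      # first-order accurate
    err_trap = abs(trapezoid(20) - exact)   # second-order: much smaller error
    ```

    With 20 steps the trapezoidal transcription is already orders of magnitude more accurate, which is the kind of discretization effect the abstract refers to.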

    Theoretical Design and Analysis of Multivolume Digital Assays with Wide Dynamic Range Validated Experimentally with Microfluidic Digital PCR

    This paper presents a protocol using theoretical methods and free software to design and analyze multivolume digital PCR (MV digital PCR) devices; the theory and software are also applicable to the design and analysis of dilution series in digital PCR. MV digital PCR minimizes the total number of wells required for “digital” (single molecule) measurements while maintaining high dynamic range and high resolution. In some examples, multivolume designs with fewer than 200 total wells are predicted to provide dynamic range with 5-fold resolution similar to that of single-volume designs requiring 12 000 wells. Mathematical techniques were utilized and expanded to maximize the information obtained from each experiment and to quantify the performance of devices, and were experimentally validated using the SlipChip platform. MV digital PCR was demonstrated to perform reliably, and results from wells of different volumes agreed with one another. No artifacts due to different surface-to-volume ratios were observed, and single molecule amplification in volumes ranging from 1 to 125 nL was self-consistent. The device presented here was designed to meet the testing requirements for measuring clinically relevant levels of HIV viral load at the point-of-care (in plasma, 1 000 000 molecules/mL), and the predicted resolution and dynamic range were experimentally validated using a control sequence of DNA. This approach simplifies digital PCR experiments, saves space, and thus enables multiplexing using separate areas for each sample on one chip, and facilitates the development of new high-performance diagnostic tools for resource-limited applications. The theory and software presented here are general and are applicable to designing and analyzing other digital analytical platforms, including digital immunoassays and digital bacterial analysis. The approach is not limited to SlipChip and could also be useful for the design of systems on other platforms, including valve-based and droplet-based devices.
In a separate publication by Shen et al. (J. Am. Chem. Soc., 2011, DOI: 10.1021/ja2060116), this approach is used to design and test digital RT-PCR devices for quantifying RNA
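    Digital PCR readouts are conventionally analyzed with Poisson statistics, and a multivolume design combines positive-well counts at several volumes into a single concentration estimate. A minimal sketch of that standard calculation (a maximum-likelihood estimate found by bisection; the counts and names below are invented for illustration, not taken from the paper):

    ```python
    import math

    # Poisson model: a well of volume v is positive with probability
    # p(v) = 1 - exp(-c * v), where c is the target concentration.
    # The MLE for c given counts at several volumes solves dlogL/dc = 0;
    # we find the root by bisection. Data below are made up.

    # (volume in nL, number of wells, number of positive wells)
    data = [(1.0, 160, 12), (5.0, 160, 53), (25.0, 160, 139), (125.0, 160, 160)]

    def score(c):
        """Derivative of the log-likelihood with respect to c."""
        s = 0.0
        for v, n, k in data:
            if k > 0:
                p = 1.0 - math.exp(-c * v)
                s += k * v * math.exp(-c * v) / p   # positive wells
            s -= (n - k) * v                        # negative wells
        return s

    # Bisection: the score is decreasing in c on (0, inf).
    lo, hi = 1e-9, 10.0
    for _ in range(100):
        mid = 0.5 * (lo + hi)
        if score(mid) > 0:
            lo = mid
        else:
            hi = mid
    c_hat = 0.5 * (lo + hi)   # estimated concentration, molecules per nL
    ```

    Pooling volumes spanning 1 to 125 nL this way is what lets a small well count cover a wide dynamic range: the small wells stay unsaturated at high concentrations while the large wells retain sensitivity at low ones.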

    Stories in Song : Voice Faculty Recital

    The talented members of the KSU voice faculty present a special recital featuring works of Britten, Schubert, and Schumann. Featured faculty artists include Todd Wedge, Heather Witt, Jana Young, and Drs. Nathan Munson and Eric Jenkins.

    The Application of Modern Optimization Packages in Multisensor Data Assimilation

    PyDDA is an expandable framework that integrates data from weather radars and forecasting models, using SciPy's optimization package to create meteorological fields.

    PyDDA: A New Pythonic Wind Retrieval Package

    PyDDA (Pythonic Direct Data Assimilation) is a new community framework for wind retrievals that depends only upon utilities in the SciPy ecosystem such as scipy, numpy, and dask. It supports retrievals of winds using information from weather radar networks, constrained by high-resolution forecast models, over grids that cover thousands of kilometers at kilometer-scale resolution. Unlike past wind retrieval packages, PyDDA can be installed using anaconda and, with a focus on ease of use, can retrieve winds from gridded radar and model data with just a few lines of code. The package is currently available for download at https://github.com/openradar/PyDDA.
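    As background to the multi-Doppler retrieval these two abstracts describe (a toy sketch of the underlying geometry, not PyDDA's actual API; the angles and velocities are invented): each radar observes only the component of the wind along its own look direction, so combining radial velocities from two viewing angles recovers the horizontal wind vector.

    ```python
    import math

    # Toy sketch of the multi-Doppler principle behind wind retrievals:
    # a radar looking along azimuth theta measures the radial component
    #   vr = u * sin(theta) + v * cos(theta)
    # of the horizontal wind (u, v). Two non-parallel views determine (u, v).
    # All numbers below are invented for illustration.

    u_true, v_true = 8.0, -3.0                 # "unknown" wind, m/s
    theta1, theta2 = math.radians(30), math.radians(110)

    def radial(u, v, theta):
        return u * math.sin(theta) + v * math.cos(theta)

    vr1 = radial(u_true, v_true, theta1)       # observation from radar 1
    vr2 = radial(u_true, v_true, theta2)       # observation from radar 2

    # Solve the 2x2 linear system for (u, v) by Cramer's rule.
    det = math.sin(theta1) * math.cos(theta2) - math.sin(theta2) * math.cos(theta1)
    u_est = (vr1 * math.cos(theta2) - vr2 * math.cos(theta1)) / det
    v_est = (math.sin(theta1) * vr2 - math.sin(theta2) * vr1) / det
    ```

    Frameworks like PyDDA generalize this idea to full 3D grids by minimizing a variational cost function that blends many radar observations with model constraints such as mass continuity, rather than solving point-by-point.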